676 research outputs found

    Type-Based Termination, Inflationary Fixed-Points, and Mixed Inductive-Coinductive Types

    Type systems certify program properties in a compositional way. From a bigger program one can abstract out a part and certify the properties of the resulting abstract program by just using the type of the part that was abstracted away. Termination and productivity are non-trivial yet desired program properties, and several type systems have been put forward that guarantee termination, compositionally. These type systems are intimately connected to the definition of least and greatest fixed-points by ordinal iteration. While most type systems use conventional iteration, we consider inflationary iteration in this article. We demonstrate how this leads to a more principled type system, with recursion based on well-founded induction. The type system has a prototypical implementation, MiniAgda, and we show in particular how it certifies productivity of corecursive and mixed recursive-corecursive functions.
    Comment: In Proceedings FICS 2012, arXiv:1202.317
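    To give a flavor of the productivity property such type systems certify, here is a minimal Haskell sketch (illustrative only; it is not the paper's MiniAgda code, and the sized-type discipline is only hinted at in comments). Each corecursive call sits directly under a constructor, which is the shape a type-based termination checker accepts.

        -- Streams as a coinductive type: every element must be computable
        -- in finitely many steps (productivity).
        data Stream a = Cons a (Stream a)

        -- Productive: the corecursive call is guarded by the constructor 'Cons'.
        -- In a sized-type system, the result would carry a size that grows with
        -- each 'Cons', justifying the definition by well-founded induction.
        natsFrom :: Integer -> Stream Integer
        natsFrom n = Cons n (natsFrom (n + 1))

        -- A mixed recursive-corecursive flavor: the output is again guarded.
        interleave :: Stream a -> Stream a -> Stream a
        interleave (Cons x xs) ys = Cons x (interleave ys xs)

        -- Observation: take a finite prefix, by ordinary structural recursion on n.
        takeS :: Int -> Stream a -> [a]
        takeS n (Cons x xs)
          | n <= 0    = []
          | otherwise = x : takeS (n - 1) xs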

    Normalization by Evaluation in the Delay Monad: A Case Study for Coinduction via Copatterns and Sized Types

    In this paper, we present an Agda formalization of a normalizer for simply-typed lambda terms. The normalizer consists of two coinductively defined functions in the delay monad: One is a standard evaluator of lambda terms to closures, the other a type-directed reifier from values to eta-long beta-normal forms. Their composition, normalization-by-evaluation, is shown to be a total function a posteriori, using a standard logical-relations argument. The successful formalization serves as a proof-of-concept for coinductive programming and reasoning using sized types and copatterns, a new and presently experimental feature of Agda.
    Comment: In Proceedings MSFP 2014, arXiv:1406.153
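    For readers unfamiliar with the delay monad, the following Haskell sketch shows the standard coinductive type and its monad structure (this is a generic illustration, not the paper's Agda formalization, which additionally uses sizes and copatterns to certify productivity).

        -- The delay monad: a value now, or a computation delayed by one step.
        data Delay a = Now a | Later (Delay a)

        instance Functor Delay where
          fmap f (Now a)   = Now (f a)
          fmap f (Later d) = Later (fmap f d)

        instance Applicative Delay where
          pure = Now
          Now f   <*> d = fmap f d
          Later f <*> d = Later (f <*> d)

        instance Monad Delay where
          Now a   >>= k = k a
          Later d >>= k = Later (d >>= k)

        -- A silently diverging computation: no 'Now' is ever reached.
        never :: Delay a
        never = Later never

        -- Running with a step budget; an "a posteriori" totality argument, as in
        -- the paper, shows that well-typed terms always reach 'Now'.
        runFor :: Int -> Delay a -> Maybe a
        runFor _ (Now a) = Just a
        runFor n (Later d)
          | n <= 0    = Nothing
          | otherwise = runFor (n - 1) d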

    On the lattice of normal subgroups in ultraproducts of compact simple groups

    We prove that the lattice of normal subgroups of ultraproducts of compact simple non-abelian groups is distributive. In the case of ultraproducts of finite simple groups or compact connected simple Lie groups of bounded rank, the set of normal subgroups is shown to be linearly ordered by inclusion.
    Comment: 33 pages, no figures; v3 minor revision, to appear in PLM

    Automatic Generation of Models of Microarchitectures

    Detailed microarchitectural models are necessary to predict, explain, or optimize the performance of software running on modern microprocessors. Building such models often requires significant manual effort, as the documentation provided by hardware manufacturers is typically not precise enough. The goal of this thesis is to develop techniques for generating microarchitectural models automatically. In the first part, we focus on recent x86 microarchitectures. We implement a tool to accurately evaluate small microbenchmarks using hardware performance counters. We then describe techniques to automatically generate microbenchmarks for measuring the performance of individual instructions and for characterizing cache architectures. We apply our implementations to more than a dozen different microarchitectures. In the second part of the thesis, we study more general techniques to obtain models of hardware components. In particular, we propose the concept of gray-box learning, and we develop a learning algorithm for Mealy machines that exploits prior knowledge about the system to be learned. Finally, we show how this algorithm can be adapted to minimize incompletely specified Mealy machines, a well-known NP-complete problem. Our implementation outperforms existing exact minimization techniques by several orders of magnitude on a number of hard benchmarks; it is even competitive with state-of-the-art heuristic approaches.
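    To fix terminology for the second part, the sketch below gives the textbook notion of a Mealy machine that learning and minimization algorithms operate on; it is illustrative Haskell, not the thesis' implementation, and all names are chosen here for exposition.

        -- A deterministic Mealy machine: a start state and a step function that
        -- maps a state and an input to a successor state and an output.
        data Mealy s i o = Mealy
          { start :: s
          , step  :: s -> i -> (s, o)
          }

        -- Run the machine on an input word, collecting the output word.
        run :: Mealy s i o -> [i] -> [o]
        run m = go (start m)
          where
            go _ []       = []
            go s (x : xs) = let (s', y) = step m s x in y : go s' xs

        -- Example: after each input bit, output whether the number of 'True'
        -- bits seen so far is even.  run parityEven [True, False, True]
        -- yields [False, False, True].
        parityEven :: Mealy Bool Bool Bool
        parityEven = Mealy
          { start = True   -- True means "even number of True bits so far"
          , step  = \e b -> let e' = if b then not e else e in (e', e')
          }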

    On Model-Theoretic Strong Normalization for Truth-Table Natural Deduction


    Well-Founded Recursion over Contextual Objects

    We present a core programming language that supports writing well-founded structurally recursive functions using simultaneous pattern matching on contextual LF objects and contexts. The main technical tool is a coverage checking algorithm that also generates valid recursive calls. To establish consistency, we define a call-by-value small-step semantics and prove, using a reducibility semantics, that every well-typed program terminates. Based on the presented methodology, we have implemented a totality checker as part of the programming and proof environment Beluga, where it can be used to establish that a total Beluga program corresponds to a proof.
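    The Beluga development works over contextual LF objects; the plain Haskell sketch below only illustrates, on ordinary first-order terms, the shape of definitions such a checker validates: exhaustive pattern matching (coverage) with every recursive call on a proper subterm (well-founded structural recursion).

        -- Illustrative only: untyped lambda terms in de Bruijn style.
        data Tm = Var Int | Lam Tm | App Tm Tm

        -- Each clause covers one constructor, and each recursive call is on an
        -- immediate subterm; this is exactly what a coverage-and-termination
        -- checker verifies.
        size :: Tm -> Int
        size (Var _)   = 1
        size (Lam t)   = 1 + size t
        size (App t u) = 1 + size t + size u

        -- Simultaneous pattern matching on two arguments, still decreasing
        -- structurally in both.
        sameShape :: Tm -> Tm -> Bool
        sameShape (Var _)   (Var _)   = True
        sameShape (Lam t)   (Lam u)   = sameShape t u
        sameShape (App t u) (App v w) = sameShape t v && sameShape u w
        sameShape _         _         = False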

    On Irrelevance and Algorithmic Equality in Predicative Type Theory

    Dependently typed programs contain an excessive amount of static terms which are necessary to please the type checker but irrelevant for computation. To separate static and dynamic code, several static analyses and type systems have been put forward. We consider Pfenning's type theory with irrelevant quantification, which is compatible with a type-based notion of equality that respects eta-laws. We extend Pfenning's theory to universes and large eliminations and develop its meta-theory. Subject reduction, normalization, and consistency are obtained by a Kripke model over the typed equality judgement. Finally, a type-directed equality algorithm is described whose completeness is proven by a second Kripke model.
    Comment: 36 pages, supersedes the FoSSaCS 2011 paper of the first author, titled "Irrelevance in Type Theory with a Heterogeneous Equality Judgement"
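    As a toy illustration of irrelevant quantification (plain Haskell over a first-order term language, not the paper's dependent type theory): arguments marked irrelevant are skipped by the algorithmic equality, since they exist only to satisfy the type checker and play no computational role.

        data Relevance = Relevant | Irrelevant
          deriving (Eq, Show)

        -- Applications carry a relevance annotation on the argument.
        data Term
          = Const String
          | App Term Relevance Term
          deriving Show

        -- A type-directed algorithm would also compare at a type; here we show
        -- only the structural part: irrelevant arguments are not compared at all.
        eqTerm :: Term -> Term -> Bool
        eqTerm (Const c)   (Const d)   = c == d
        eqTerm (App f r a) (App g s b) =
          eqTerm f g && r == s && (r == Irrelevant || eqTerm a b)
        eqTerm _           _           = False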